# Long Text Understanding

## Dolphin 2.7 Mixtral 8x7b AWQ

Dolphin 2.7 Mixtral 8X7B is a large language model based on the Mixtral architecture, focusing on code generation and instruction-following tasks.

- License: Apache-2.0
- Tags: Large Language Model, Transformers, English
- Author: TheBloke
- Downloads: 5,839 · Likes: 22

## Bert Large Japanese Wikipedia Ud Head Finetuned Inquiry

A BERT-large model pre-trained on Japanese Wikipedia data and fine-tuned for UD head parsing tasks.

- Tags: Large Language Model, Transformers
- Author: anhcanvasasia
- Downloads: 33 · Likes: 0

## Albert Chinese Large Qa

A large ALBERT-based Chinese question-answering model pre-trained on the Baidu WebQA and Baidu DuReader datasets, suitable for Chinese Q&A tasks.

- License: Apache-2.0
- Tags: Question Answering System, Transformers, Chinese
- Author: wptoux
- Downloads: 32 · Likes: 12

## Albert Gpt2 Full Summarization Cnndm

A news summarization model based on the ALBERT and GPT2 architectures, fine-tuned on the CNN/DailyMail dataset.

- Tags: Text Generation, Transformers
- Author: Ayham
- Downloads: 15 · Likes: 0
© 2025 AIbase